
yōkobo

an object of sensitive presence

Dominique Deuff, Gentiane Venture, Isabelle Milleville-Pennel & Ioana Ocnarescu - March 23, 2023

the original language of this article is english

about this contribution

As part of multidisciplinary research and a PhD project aiming to strengthen the connection between retired couples living at home, we imagined and designed Yōkobo. It is a robot at the crossroads of a sensitive approach and a robotic trend that bridges the gap between humans (the field of Human-Robot-Human Interaction). As a theoretical contribution, Yōkobo sits at the intersection of several concepts: behavioral objects, robjects, weak robotics, and slow technology.

Yōkobo is a trinket bowl placed at the entrance of the home. Its discreet presence expresses hospitality and celebrates small moments of everyday life, welcoming visitors and the inhabitants of the house. The name is a contraction of “yōkoso” (“welcome” in Japanese) and “robot” (with its French pronunciation). Beyond these functions, Yōkobo expresses the state of the home using data from connected IoT devices, combining various house parameters (such as temperature and air quality) to convey the home’s “mood” through its motions. Finally, used in tandem with the house keys, Yōkobo can convey a trace, a motion-based message that acts as a memory of the partner’s passage.
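
To make this mapping concrete, the short Python sketch below shows one possible way of blending a few normalized house parameters into a single “mood” value and turning it into motion and light parameters. It is a minimal illustration only: the names (HouseState, mood, motion_profile), the weights, and the comfort point are hypothetical and do not come from Yōkobo’s actual software, which is described in the project publications listed in the bibliography.

    from dataclasses import dataclass

    @dataclass
    class HouseState:
        temperature_c: float  # reading from a connected thermostat
        air_quality: float    # normalized: 0 (poor) to 1 (good)

    def mood(state: HouseState) -> float:
        # Blend the parameters into a single 0-1 "mood" value.
        # The weights and the 21 °C comfort point are illustrative only.
        comfort = max(0.0, 1.0 - abs(state.temperature_c - 21.0) / 10.0)
        return 0.6 * comfort + 0.4 * state.air_quality

    def motion_profile(m: float) -> dict:
        # Map the mood to expressive motion and light parameters:
        # slow, narrow, warm movements when the home feels comfortable,
        # wider and quicker ones when it does not.
        return {
            "sweep_amplitude_deg": 10 + 25 * (1.0 - m),
            "sweep_period_s": 2.0 + 6.0 * m,
            "light_warmth": m,  # 0 = cold light, 1 = warm light
        }

    print(motion_profile(mood(HouseState(temperature_c=22.5, air_quality=0.8))))

The point of such a mapping is that the output is never a symbolic message: the house state is only legible through the quality of the movement itself, which is what makes the reading of Yōkobo progressive and perceptual rather than instantaneous.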

Yōkobo is resolutely innovative and disruptive; it breaks with the general vision of what robots are and what they can do:

  • it is an object intended to be unobtrusive, stemming from ambient computing, while maintaining an ongoing, subtle presence. Unlike voice assistants and the broader trend toward voice interaction, it makes no sound: it expresses its environment only through motion and light.
  • to move away from home companion robots and the biases they can generate through facial representation, Yōkobo has neither an anthropomorphic shape nor the ability to talk.
  • Yōkobo is intended to be made of natural materials such as ceramic, wood, or wool to break with the idea of plastic, disposable, and toy robots, and to improve its integration in everyday home life.
  • as a slow technology product, understanding and integrating Yōkobo into one’s life takes time and requires accepting not having a clear, repetitive, and instantaneous response to an action. Its contribution is not measured in terms of efficiency and utility; it is the sum of different experiences with the product over time that creates the object’s meaning and value. Getting to know Yōkobo’s expressive motions is continuous and progressive. Yōkobo is an object that is understood through perception and touches the poetic sensibility of its users.

Yōkobo is a concept that puts people’s relationships at the center. It does not impose itself by proposing an exclusive Human-Object relationship. Instead, it reveals the presence of the other by expressing the last, impermanent trace of the other’s passage. It is an object of sensitive presence.

This work is the result of interdisciplinary research between roboticists, designers, and ergonomists. The navigation (directions and overlay) of this pan.able demonstrates the design and engineering processes, as well as the interaction modalities.

credits

authors: Dominique Deuff, ergonomist and designer, Orange
Gentiane Venture, roboticist, Tokyo University of Agriculture and Technology
Isabelle Milleville-Pennel, cognitive ergonomist, LS2N UMR CNRS 6004
Ioana Ocnarescu, designer, Strate Research, Strate School of Design

with: Enrique Coronado, roboticist, Tokyo University of Agriculture and Technology
Liz Rincon, roboticist, Tokyo University of Agriculture and Technology
Dora Garcin, experience designer, Strate School of Design & Tokyo University of Agriculture and Technology
Corentin Aznar, product designer, Strate School of Design & Tokyo University of Agriculture and Technology
Shohei Hagane, roboticist, Tokyo University of Agriculture and Technology
Simeon Capy, roboticist, Tokyo University of Agriculture and Technology
Pablo Osorio, roboticist, Tokyo University of Agriculture and Technology
Remi Dupuis, product designer, Strate School of Design
Dino Beschi, product designer, Strate School of Design & Tokyo University of Agriculture and Technology
Nicolas Pellen, designer, Orange

support: Orange, GV lab, LS2N UMR CNRS 6004, Strate School of Design (Strate Research)

acknowledgements: Nantes Université, Tokyo University of Agriculture and Technology, Strate’s workshop team, Valentina Ramirez Millan

translation: Monique Gross

copy editing: Bronwyn Mahoney

references and rights

illustration rights and references


Layer 1

Original drawings, Dora Garcin, 2020. Image credit and graphic transformation: Dominique Deuff, 2021.

 

Layer 2 (images from left to right)

Images 1 to 8: 3D-generated images, Dino Beschi, 2021. Image credit and graphic transformation: Dino Beschi, 2021.

Image 9: photograph, Dominique Deuff, 2021. Photo credit and graphic transformation: Dominique Deuff, 2021.

Image 10: 3D model, Nicolas Pellen, 2021. 3D-generated images, Clément Laurenziani, 2021. Image credit and graphic transformation: Dominique Deuff, 2021.

Image 11: 3D model, Nicolas Pellen, 2021. 3D-generated images, Nicolas Pellen, 2021. Image credit and graphic transformation: Dominique Deuff, 2021.

Images 12 and 13: drawings, Dominique Deuff, 2021. Image credit and graphic transformation: Dominique Deuff, 2021.

Images 14 and 15: 3D-generated images, Corentin Aznar, 2020. Image credit and graphic transformation: Dominique Deuff, 2021.

Images 16 and 17: drawings, Corentin Aznar, 2021. Image credit and graphic transformation: Dominique Deuff, 2021.

 

Layer 3

Drawings, Dominique Deuff, 2021. Image credit and graphic transformation: Dominique Deuff, 2021.

 

bibliography and references


Ashmore, Sondra, and Kristin Runyan. 2014. Introduction to Agile Methods. Upper Saddle River, NJ: Addison-Wesley.

Bevins, Alisha, and Brittany A. Duncan. 2021. “Aerial Flight Paths for Communication: How Participants Perceive and Intend to Respond to Drone Movements.” In HRI ’21: Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. New York: Association for Computing Machinery, 16–23, https://doi.org/10.1145/3434073.3444645.

Birmingham, Chris, Zijian Hu, Kartik Mahajan, Eli Reber, and Maja J. Matarić. 2020. “Can I Trust You? A User Study of Robot Mediation of a Support Group.” In 2020 IEEE International Conference on Robotics and Automation (ICRA). New York: IEEE, 8019–8026, https://doi.org/10.1109/ICRA40945.2020.9196875.

Brock, Heike, Selma Šabanović, and Randy Gomez. 2021. “Remote You, Haru and Me: Exploring Social Interaction in Telepresence Gaming with a Robotic Agent.” In HRI ’21 Companion: Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. New York: Association for Computing Machinery, https://doi.org/10.1145/3434074.3447177.

Broers, H. A. T., J. Ham, R. Broeders, R. De Silva, and M. Okada. 2013. “Goal inferences about robot behavior: Goal inferences and human response behaviors.” In 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI). Piscataway, NJ: IEEE, 91–92, https://doi.org/10.1109/HRI.2013.6483516.

Brooke, John. 2013. “SUS: A Retrospective.” Journal of Usability Studies 8 (2): 29–40.

Campa, Riccardo. 2016. “The Rise of Social Robots: A Review of the Recent Literature.” Journal of Evolution & Technology 26 (1): 106–113.

Cannon, Christopher, Kelly Goldsmith, and Caroline Roux. 2019. “A Self-Regulatory Model of Resource Scarcity.” Journal of Consumer Psychology 29 (1): 104–127.

Cao, Zhe, Gines Hidalgo, Tomas Simon, Shih-En Wei, and Yaser Sheikh. 2018. “OpenPose: Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields.” IEEE Transactions on Pattern Analysis & Machine Intelligence 43 (1): 172–186, arXiv: 1812.08008.

Capy, Siméon, Pablo Osorio, Shohei Hagane, Corentin Aznar, Dora Garcin, Enrique Coronado, Dominique Deuff, Ioana Ocnarescu, Isabelle Milleville, and Gentiane Venture. 2022. “Yōkobo: A Robot to Strengthen Links Amongst Users With Non-Verbal Behaviours.” Machines, in press.

Coronado, Enrique, and Gentiane Venture. 2020. “Towards IoT-Aided Human– Robot Interaction Using NEP and ROS: A Platform-Independent, Accessible and Distributed Approach.” Sensors 20 (5): 1500.

Coronado, Enrique, Gentiane Venture, and Natsuki Yamanobe. 2020. “Applying Kansei/Affective Engineering Methodologies in the Design of Social and Service Robots: A Systematic Review.” International Journal of Social Robotics (October): 1–11, https://doi.org/10.1007/s12369-020-00709-x.

Deuff, Dominique, Ioana Ocnarescu, Luis Enrique Coronado, Liz Rincon-Ardila, Isabelle Milleville, and Gentiane Venture. 2020. “Designerly way of thinking in a robotics research project.” Journal of Robotics Society of Japan 38 (8): 692–702.

Deuff, Dominique, Isabelle Milleville-Pennel, Ioana Ocnarescu, Dora Garcin, Corentin Aznar, Simeon Capy, Shohei Hagane, Pablo Osorio, Luis Enrique Coronado, Liz Rincon-Ardila, and Gentiane Venture. 2022. “Together alone, Yōkobo, a sensible presence robject for the home of newly retired couples.” In DIS ’22: Proceedings of the 2022 ACM Designing Interactive Systems Conference. New York: Association for Computing Machinery.

Duarte, Nuno Ferreira, Mirko Raković, Jovica Tasevski, Moreno Ignazio Coco, Aude Billard, and José Santos-Victor. 2018. “Action anticipation: Reading the intentions of humans and robots.” IEEE Robotics and Automation Letters 3 (4) (October): 4132–4139, https://doi.org/10.1109/LRA.2018.2861569.

Erel, Hadas, Yoav Cohen, Klil Shafrir, Sara Daniela Levy, Idan Dov Vidra, Tzachi Shem Tov, and Oren Zuckerman. 2021. “Excluded by Robots: Can Robot-Robot-Human Interaction Lead to Ostracism?” In HRI ’21: Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. New York: Association for Computing Machinery, 312–321, https://doi.org/10.1145/3434073.3444648.

Feil-Seifer, David, and Maja J Matarić. 2011. “Socially Assistive Robotics.” IEEE Robotics & Automation Magazine 18 (1) (March): 24–31, https://doi.org/10.1109/MRA.2010.940150.

Georgiev, Aleksandar, and Stephan Schlögl. 2018. “Smart Home Technology: An Exploration of End User Perceptions.” Innovative Lösungen für eine alternde Gesellschaft: Konferenzbeiträge der SMARTER LIVES 18, no. 20.02.

Gomez, Randy, Deborah Szapiro, Kerl Galindo, and Keisuke Nakamura. 2018. “Haru: Hardware Design of an Experimental Tabletop Robot Assistant.” In 2018 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI). New York: Association for Computing Machinery, 233–240.

GROOVE X. https://lovot.life/en/.

Haring, K. S., K. Watanabe, and C. Mougenot. 2013. “The influence of robot appearance on assessment.” In 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI). New York: Association for Computing Machinery. 131–132, https://doi.org/10.1109/HRI.2013.6483536.

Heenan, Brandon, Saul Greenberg, Setareh Aghel-Manesh, and Ehud Sharlin. 2014. “Designing social greetings in human robot interaction.” In DIS ’14: Proceedings of the 2014 conference on Designing interactive systems. New York: Association for Computing Machinery, 855–864.

Hoffman, Guy. 2012. “Dumb Robots, Smart Phones: A Case Study of Music Listening Companionship.” IEEE International Symposium on Robot and Human Interactive Communication (September): 358–363, https://doi.org/10.1109/ROMAN.2012.6343779.

Hoffman, Guy, and Wendy Ju. 2014. “Designing Robots with Movement in Mind.” Journal of Human-Robot Interaction 3 (1): 91–122.

Intuition Robotics. https://elliq.com.

Knight, Heather. 2011. “Eight Lessons Learned about Non-Verbal Interactions through Robot Theater.” In Social Robotics: Third International Conference, ICSR 2011, Amsterdam, The Netherlands, November 24–25, 2011. Berlin: Springer, 42–51.

Latikka, Rita, Tuuli Turja, and Atte Oksanen. 2019. “Self-Efficacy and Acceptance of Robots.” Computers in Human Behavior 93:157–163.

Lehmann, Hagen, Joan Saez-Pons, Dag Sverre Syrdal, and Kerstin Dautenhahn. 2015. “In Good Company? Perception of Movement Synchrony of a Non-Anthropomorphic Robot.” PloS One 10 (5): e0127747.

Levillain, Florent, and Elisabetta Zibetti. 2017. “Behavioral Objects: The Rise of the Evocative Machines.” Journal of Human-Robot Interaction 6 (1): 4–24.

Li, Dingjun, PL Patrick Rau, and Ye Li. 2010. “A Cross-Cultural Study: Effect of Robot Appearance and Task.” International Journal of Social Robotics 2 (2): 175–186.

Liang, Jun, Deqiang Xian, Xingyu Liu, Jing Fu, Xingting Zhang, Buzhou Tang, Jianbo Lei, et al. 2018. “Usability Study of Mainstream Wearable Fitness Devices: Feature Analysis and System Usability Scale Evaluation.” JMIR mHealth and uHealth 6 (11): e11066.

Luria, Michal, Guy Hoffman, and Oren Zuckerman. 2017. “Comparing Social Robot, Screen and Voice Interfaces for Smart-Home Control.” In CHI ’17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York: The Association for Computing Machinery, 580–628.

Martin, Bella, and Bruce Hanington. 2012. Universal Methods of Design: 100 Ways to Research Complex Problems, Develop Innovative Ideas, and Design Effective Solutions. Beverly, MA: Rockport Publishers.

Miseikis, Justinas, Pietro Caroni, Patricia Duchamp, Alina Gasser, Rastislav Marko, Nelija Miseikienė, Frederik Zwilling, Charles de Castelbajac, Lucas Eicher, Michael Fruh, et al. 2020. “Lio-A Personal Robot Assistant for Human-Robot Interaction and Care Applications.” IEEE Robotics and Automation Letters 5 (4): 5339–5346, https://doi.org/10.1109/LRA.2020.3007462.

Mondada, Francesco, Julia Fink, Séverin Lemaignan, David Mansolino, Florian Wille, and Karmen Franinović. 2016. “Ranger, an example of integration of robotics into the home ecosystem.” In New Trends in Medical and Service Robots 38: 181–189, https://doi.org/10.1007/978-3-319-23832-6_15.

Mori, M., K. F. MacDorman, and N. Kageki. 2012. “The Uncanny Valley [From the Field].” IEEE Robotics & Automation Magazine 19 (2): 98–100, https://doi.org/10.1109/MRA.2012.2192811.

Nielsen, Jakob, and Thomas K. Landauer. 1993. “A Mathematical Model of the Finding of Usability Problems.” In Proceedings of the INTERACT ’93 and CHI ’93 Conference on Human Factors in Computing Systems. New York: Association for Computing Machinery, 206–213, https://doi.org/10.1145/169059.169166.

Odom, William T., Abigail J. Sellen, Richard Banks, David S. Kirk, Tim Regan, Mark Selby, Jodi L. Forlizzi, and John Zimmerman. 2014. “Designing for Slowness, Anticipation and Re-Visitation: A Long Term Field Study of the Photobox.” In CHI ’14: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York: Association for Computing Machinery, 1961–1970, https://doi.org/10.1145/2556288.2557178.

Ostrowski, Anastasia K., Vasiliki Zygouras, HaeWon Park, and Cynthia Breazeal. 2021. “Small Group Interactions with Voice-User Interfaces: Exploring Social Embodiment, Rapport, and Engagement.” In HRI ’21: Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. New York: Association for Computing Machinery, 322–331, https://doi.org/10.1145/3434073.3444655

Palaver, Wolfgang. 2013. René Girard’s Mimetic Theory. Translated by Gabriel Borrud. East Lansing: Michigan State University Press.

Paschal, T., M. A. Bell, J. Sperry, S. Sieniewicz, R. J. Wood, and J. C. Weaver. 2019. “Design, Fabrication, and Characterization of an Untethered Amphibious Sea Urchin-Inspired Robot.” IEEE Robotics and Automation Letters 4 (4): 3348–3354, https://doi.org/10.1109/LRA.2019.2926683.

Ramirez Millan, Valentina, Dominique Deuff, and Gentiane Venture. Forthcoming. “Egg shaped, white and emotional robots.” Journal of Intelligent & Robotic Systems.

Scassellati, Brian, Laura Boccanfuso, Chien-Ming Huang, Marilena Mademtzi, Meiying Qin, Nicole Salomons, Pamela Ventola, and Frederick Shic. 2018. “Improving social skills in children with ASD using a long-term, in-home social robot.” Science Robotics 3 (21): https://doi.org/10.1126/scirobotics.aat7544.

Trovato, Gabriele, Massimiliano Zecca, Salvatore Sessa, Lorenzo Jamone, Jaap Ham, Kenji Hashimoto, and Atsuo Takanishi. 2013. “Cross-cultural study on human-robot greeting interaction: acceptance and discomfort by Egyptians and Japanese.” Paladyn, Journal of Behavioral Robotics 4 (2): 83–93, https://doi.org/10.2478/pjbr-2013-0006.

Vaussard, Florian, Michael Bonani, Philippe Rétornaz, Alcherio Martinoli, and Francesco Mondada. 2011. “Towards autonomous energy-wise RObjects.” In Towards Autonomous Robotic Systems: Proceedings of the 12th Conference Towards Autonomous Robotic Systems. Berlin: Springer, 311–322.

Venkatesh, Alladi. 1985. “A Conceptualization of the Household/Technology.” Advances in Consumer Research 12: 189–194.

Venkatesh, Alladi. 1996. “Computers and Other Interactive Technologies for the Home.” Communications of the ACM 39 (12): 47–54.

Venture, Gentiane, and Dana Kulić. 2019. “Robot expressive motions: a survey of generation and evaluation methods.” ACM Transactions on Human-Robot Interaction 8 (4): 1–17.

Verhagen, Tibert, Bart Van Den Hooff, and Selmar Meents. 2015. “Toward a better use of the semantic differential in IS research: An integrative framework of suggested action.” Journal of the Association for Information Systems 16 (2):1.

to cite this article

This article uses the Chicago format for its references.

Deuff, Dominique, Gentiane Venture, Isabelle Milleville-Pennel, and Ioana Ocnarescu. 2023. “Yōkobo, an object of sensitive presence.” .able journal: https://able-journal.org/yokobo
